Proposal PR: CI-based performance testing with Docker runner #43
base: main
Conversation
Awesome to see this work. I think a lot of this would be great as incremental PRs into the existing CLI setup we have merged. Are you up for that? Are the Windows issues from #45 blocking you? Would we be able to do smaller incremental PRs for that to move it forward? Or is something else blocking integrating this work into the existing CLI?
@@ -0,0 +1,24 @@
FROM debian:bullseye-slim
I would take a look at what the existing docker runner is doing. It uses the official node.js docker images. I think we want to stick to using supported images from either node.js, GH, or our partners like NodeSource.
https://github.com/expressjs/perf-wg/blob/main/packages/runner-docker/Dockerfile
Good point. This could also be an optimization; I'm not sure, but it's a valid point.
I don't think we should optimize system things, we should use what our users would use with the goal of being "close to the platform". That way we may catch perf issues from common configurations before users do (or at least can easily test them).
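As a sketch of what the suggestion above points at, here is a hypothetical Dockerfile using an official Node.js base image instead of bare Debian. The image tag, working directory, and entrypoint are all assumptions, not what the PR or the existing runner-docker package actually uses:

```dockerfile
# Hypothetical sketch: start from an official Node.js image rather than
# installing Node.js on top of debian:bullseye-slim. The tag is an assumption.
FROM node:22-bookworm-slim

WORKDIR /app

# Install only production dependencies for the runner.
COPY package*.json ./
RUN npm ci --omit=dev

COPY . .

ENTRYPOINT ["node", "index.mjs"]
```

Pinning to an official image keeps the environment close to what users actually deploy, which is the "close to the platform" goal stated above.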
@@ -0,0 +1,62 @@
import { config, validateConfig } from './src/config.mjs';
I think this is a good start for a runner-local package. Maybe take a look at what I did here and follow that pattern? https://github.com/expressjs/perf-wg/blob/main/packages/runner-docker/index.mjs
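For context, a runner entry point in that pattern would typically validate its config before doing any work. Everything below (the config shape, field names, and the `validateConfig` return shape) is a hypothetical sketch, not the PR's actual `src/config.mjs`:

```javascript
// Hypothetical config for a local runner; field names are assumptions.
export const config = {
  nodeVersions: ['20', '22'],
  iterations: 1000,
  testsDir: 'expf-tests',
};

// Return { valid, errors } rather than throwing, so the CLI can report
// every problem at once instead of failing on the first one.
export function validateConfig(cfg) {
  const errors = [];
  if (!Array.isArray(cfg.nodeVersions) || cfg.nodeVersions.length === 0) {
    errors.push('nodeVersions must be a non-empty array');
  }
  if (!Number.isInteger(cfg.iterations) || cfg.iterations <= 0) {
    errors.push('iterations must be a positive integer');
  }
  if (typeof cfg.testsDir !== 'string' || cfg.testsDir === '') {
    errors.push('testsDir must be a non-empty string');
  }
  return { valid: errors.length === 0, errors };
}
```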
Answered in PR Comment: #43 (comment)
/**
 * Post a comment to a GitHub PR
 */
export async function postComment(message) {
If we are using GHA we can just rely on their PR checks instead of making comments, right? They even have HTML reports: https://docs.github.com/en/actions/reference/workflows-and-actions/workflow-commands#adding-a-job-summary
I didn't know that, I'll look into it, thank you.
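For reference, the job-summary mechanism mentioned above works by appending Markdown to the file whose path GitHub Actions puts in the `GITHUB_STEP_SUMMARY` environment variable. A minimal sketch (the `writeJobSummary` name and the stdout fallback are made up for illustration):

```javascript
import { appendFile } from 'node:fs/promises';

// Hypothetical helper: write a Markdown job summary instead of posting
// a PR comment. GITHUB_STEP_SUMMARY is set by GitHub Actions per step.
export async function writeJobSummary(markdown) {
  const summaryPath = process.env.GITHUB_STEP_SUMMARY;
  if (!summaryPath) {
    // Not running inside GitHub Actions; fall back to stdout for local runs.
    console.log(markdown);
    return;
  }
  await appendFile(summaryPath, markdown + '\n');
}
```

This avoids needing a token with PR write permissions, since the summary is part of the workflow run itself.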
@@ -0,0 +1,137 @@
import { readFileSync } from 'fs';
It would be awesome to see this as an incremental addition to the existing compare command. Especially awesome to see it output the node version and other things you have in here.
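As an illustration of the kind of environment info a compare run could report alongside results, here is a hypothetical sketch; the function name and field names are assumptions, not the PR's actual code:

```javascript
import os from 'node:os';

// Hypothetical sketch: metadata a compare run could attach to its output
// so results are comparable across machines and Node.js versions.
export function collectRunMetadata() {
  return {
    nodeVersion: process.version,      // e.g. 'v22.14.0'
    platform: os.platform(),           // 'linux', 'darwin', 'win32', ...
    arch: os.arch(),
    cpuModel: os.cpus()[0]?.model ?? 'unknown',
    totalMemBytes: os.totalmem(),
  };
}
```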
Answered in PR Comment: #43 (comment)
Right now, I have no idea, sorry 😄, but my target is simply:
And your target is simply:
The base conflict I see:
I'm thinking of creating a testing folder for repo-specific tests rather than the perf-wg repository setup. These could be combined.
Co-authored-by: Sebastian Beltran <[email protected]>
Yes, I did this for what I think are good reasons:
Does number 3 help resolve this conflict for you? I was treating the tests in this repo as starting examples that we could move out once we are sure they are the right ones. That way, we prevent work happening all over the org's ~40 repos that we might have to update if we make core tooling changes.
Performance Runner Proposal
Docker-based performance testing framework for JavaScript packages with CI integration and multi-Node.js version support.
Summary
Containerized test runner that provides:
Quick Setup
Put your test files in the expf-tests/ directory. For details, read /perf-runner/readme.md.
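As an illustration only, a file in expf-tests/ might look something like the sketch below. The file naming convention and the exported shape are assumptions made for this example; see /perf-runner/readme.md for the runner's real interface.

```javascript
// Hypothetical example test file, e.g. expf-tests/hello.bench.mjs.
// The exported shape here is an assumption, not the runner's real contract.
export const name = 'hello-world handler';

export async function run() {
  const start = process.hrtime.bigint();
  // ... exercise the code under test here ...
  const end = process.hrtime.bigint();
  // Report elapsed wall-clock time in nanoseconds.
  return { durationNs: Number(end - start) };
}
```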
Full Details
See the complete problem statement and technical specifications in Issue #38.